15 research outputs found

    Memory Models for Incremental Learning Architectures

    Losing V. Memory Models for Incremental Learning Architectures. Bielefeld: Universität Bielefeld; 2019.

    Technological advancement constantly leads to an exponential growth of generated data in virtually every domain, drastically increasing the burden of data storage and maintenance. Most of this data is extracted instantaneously and is available in the form of endless streams that carry the most current information. Machine learning methods constitute one fundamental way of processing such data automatically, as they generate models that capture the processes behind the data. They are omnipresent in our everyday life: their applications include personalized advertising, recommendations, fraud detection, surveillance, credit ratings, high-speed trading and smart-home devices. Batch learning, denoting the offline construction of a static model from a large dataset, is the predominant scheme. However, it is increasingly unfit to process the accumulating masses of data in the time available, and in particular its static nature cannot handle changing patterns. In contrast, incremental learning is an attractive alternative and a natural fit for the current demands. Its dynamic adaptation allows continuous processing of data streams without the necessity to store all past data, and results in always up-to-date models that can even perform in non-stationary environments. In this thesis, we tackle crucial research questions in the domain of incremental learning by contributing new algorithms or significantly extending existing ones. We consider stationary and non-stationary environments and present multiple real-world applications that showcase the merits of the methods as well as their versatility. The main contributions are the following:
    - A novel approach that addresses the question of how to extend a model for prototype-based algorithms based on cost minimization.
    - Local split-time prediction for incremental decision trees, mitigating the trade-off between adaptation speed on the one hand and model complexity and run time on the other.
    - An extensive survey of the strengths and weaknesses of state-of-the-art methods that provides guidance for choosing a suitable algorithm for a given task.
    - A new approach to extract valuable information about the type of change in a dataset.
    - A biologically inspired architecture able to handle different types of drift using dedicated memories that are kept consistent.
    - Application of the novel methods within three diverse real-world tasks, highlighting their robustness and versatility.
    - Investigation of personalized online models in the context of two real-world applications.
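    To make the contrast with batch learning concrete, the following is a minimal sketch (not taken from the thesis) of the incremental scheme: a linear classifier from scikit-learn is updated chunk by chunk with partial_fit, so each chunk can be discarded after use and the model stays up to date under a slowly drifting concept. The synthetic stream and all parameter choices are illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)

    def stream(n_chunks=100, chunk_size=50):
        """Yield small chunks of a synthetic two-class stream with gradual drift."""
        for t in range(n_chunks):
            shift = 0.02 * t                              # concept slowly changes over time
            X = rng.normal(size=(chunk_size, 2))
            y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
            yield X, y

    model = SGDClassifier()                               # supports incremental updates
    classes = np.array([0, 1])                            # label set must be declared up front

    for X_chunk, y_chunk in stream():
        model.partial_fit(X_chunk, y_chunk, classes=classes)   # update, then discard the chunk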

    Personalized Maneuver Prediction at Intersections

    Losing V, Hammer B, Wersing H. Personalized Maneuver Prediction at Intersections. Presented at the IEEE Intelligent Transportation Systems Conference 2017, Yokohama.

    Choosing the Best Algorithm for an Incremental On-line Learning Task

    Losing V, Hammer B, Wersing H. Choosing the Best Algorithm for an Incremental On-line Learning Task. Presented at the European Symposium on Artificial Neural Networks, Brügge.

    Recently, incremental and on-line learning have gained more attention, especially in the context of big data and learning from data streams, which conflict with the traditional assumption of complete data availability. Even though a variety of different methods is available, it often remains unclear which of them is suitable for a specific task and how they perform in comparison to each other. We analyze the key properties of seven incremental methods representing different algorithm classes. Our extensive evaluation on datasets with different characteristics gives an overview of the performance with respect to accuracy as well as model complexity, facilitating the choice of the best method for a given application.
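    Comparisons of incremental learners are typically run as an interleaved test-then-train (prequential) evaluation: every incoming chunk is first used for testing and then for training. The sketch below illustrates this protocol with placeholder learners and a synthetic stream; it is an assumed setup for illustration, not the paper's actual experimental design.

    import numpy as np
    from sklearn.linear_model import SGDClassifier
    from sklearn.naive_bayes import GaussianNB

    def prequential_accuracy(model, chunks, classes):
        """Interleaved test-then-train accuracy over a list of (X, y) chunks."""
        correct = tested = 0
        for i, (X, y) in enumerate(chunks):
            if i > 0:                                       # test on unseen data first ...
                correct += int((model.predict(X) == y).sum())
                tested += len(y)
            model.partial_fit(X, y, classes=classes)        # ... then train on the same chunk
        return correct / tested if tested else float("nan")

    rng = np.random.default_rng(1)

    def make_chunk(n=50):
        X = rng.normal(size=(n, 2))
        return X, (X[:, 0] > 0).astype(int)

    chunks = [make_chunk() for _ in range(40)]
    classes = np.array([0, 1])
    for name, model in [("SGD", SGDClassifier()), ("NaiveBayes", GaussianNB())]:
        print(name, round(prequential_accuracy(model, chunks, classes), 3))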

    Dedicated Memory Models for Continual Learning in the Presence of Concept Drift

    Losing V, Hammer B, Wersing H. Dedicated Memory Models for Continual Learning in the Presence of Concept Drift. Presented at the Continual Learning Workshop of the Thirtieth Annual Conference on Neural Information Processing Systems (NIPS), Barcelona.

    Incremental on-line learning: A review and comparison of state of the art algorithms

    Losing V, Hammer B, Wersing H. Incremental on-line learning: A review and comparison of state of the art algorithms. Neurocomputing. 2018;275:1261-1274.

    Interactive Online Learning for Obstacle Classification on a Mobile Robot

    Losing V, Hammer B, Wersing H. Interactive Online Learning for Obstacle Classification on a Mobile Robot. Presented at the International Joint Conference on Neural Networks, Killarney, Ireland.

    We present an architecture for incremental online learning in high-dimensional feature spaces and apply it on a mobile robot. The model is based on learning vector quantization, approaching the stability-plasticity problem of incremental learning by adaptive insertion of representative vectors. We employ a cost-function-based learning vector quantization approach and introduce a new insertion strategy that optimizes a cost function based on a subset of samples. We demonstrate the model within a real-time application for a mobile robot scenario, in which we perform interactive real-time learning of visual categories.
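    As a rough illustration of the prototype-based idea, the sketch below implements a basic LVQ1-style online update: the nearest prototype is attracted toward a sample of its own class and pushed away otherwise. It deliberately omits the cost-function-based (GLVQ-like) learning and the adaptive insertion strategy that the paper actually contributes; the class and parameter names are made up for the example.

    import numpy as np

    class SimpleLVQ:
        """Minimal LVQ1-style classifier with online (per-sample) updates."""

        def __init__(self, prototypes, labels, lr=0.05):
            self.w = np.asarray(prototypes, dtype=float)   # prototype positions
            self.c = np.asarray(labels)                    # prototype class labels
            self.lr = lr

        def partial_fit(self, x, y):
            d = np.linalg.norm(self.w - x, axis=1)         # distances to all prototypes
            k = int(np.argmin(d))                          # winning prototype
            sign = 1.0 if self.c[k] == y else -1.0         # attract if correct, repel otherwise
            self.w[k] += sign * self.lr * (x - self.w[k])

        def predict(self, x):
            d = np.linalg.norm(self.w - x, axis=1)
            return self.c[int(np.argmin(d))]

    In practice the prototypes would be initialized with, e.g., one class mean per class; the paper's contribution is to insert additional prototypes where a cost function evaluated on a subset of samples indicates they are needed.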

    The Courier, Volume 7, Issue 3, October 11, 1973

    Stories: Tabisz Missing, Board Delays ‘Rep’ Decision; If Law Enacted---Bonds May Finance New Student Center; Students Get Pay Hike Here; Record 9,996 Enrollment Reported for Fall Quarter; Budget Cuts Begin Pinching Most Clusters; TV Class Produce Own Shows. People: Rick Tabisz; Ed Dewell; Jean Hatch; Arlene Anderso

    2012 Rockin' for the Troops_02

    Photos by Jim Svehla/Special to College of DuPage. https://dc.cod.edu/marcom-studentlife-events/1299/thumbnail.jp

    Tackling heterogeneous concept drift with the Self-Adjusting Memory (SAM)

    Losing V, Hammer B, Wersing H. Tackling heterogeneous concept drift with the Self-Adjusting Memory (SAM). Knowledge and Information Systems. 2018;54(1):171-201.

    Data mining in non-stationary data streams is particularly relevant in the context of the Internet of Things and Big Data. Its challenges arise from fundamentally different drift types violating assumptions of data independence or stationarity. Available methods often struggle with certain forms of drift or require a priori task knowledge that is unavailable. We propose the Self-Adjusting Memory (SAM) model for the k-nearest-neighbor (kNN) algorithm. SAM-kNN can deal with heterogeneous concept drift, i.e. different drift types and rates, using biologically inspired memory models and their coordination. Its basic idea is to keep dedicated models for current and former concepts and to use them according to the demands of the given situation. It can easily be applied in practice without meta-parameter optimization. We conduct an extensive evaluation on various benchmarks, consisting of artificial streams with known drift characteristics and real-world datasets. We explicitly add new benchmarks that enable a precise performance analysis on multiple types of drift. Highly competitive results throughout all experiments underline the robustness of SAM-kNN as well as its capability to handle heterogeneous concept drift. Knowledge about drift characteristics in streaming data is not only crucial for a precise algorithm evaluation, but also facilitates the choice of an appropriate algorithm for real-world applications. Therefore, we additionally propose two tests able to determine the type and strength of drift. We extract the drift characteristics of all utilized datasets and use them for our analysis of SAM in relation to other methods.
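    The core mechanism can be sketched as follows: a short-term memory (STM) holds the most recent samples, older samples spill over into a long-term memory (LTM), and predictions come from whichever memory was more accurate on the recent past. The toy class below only illustrates this coordination idea; its name, the fixed window sizes, and the absence of LTM cleaning and adaptive STM sizing are all simplifications, not the published SAM-kNN.

    from collections import deque
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def fit_knn(pairs, k):
        X = np.array([p[0] for p in pairs])
        y = np.array([p[1] for p in pairs])
        return KNeighborsClassifier(n_neighbors=min(k, len(pairs))).fit(X, y)

    class TinySAMkNN:
        def __init__(self, stm_size=200, k=5, hist=50):
            self.k = k
            self.stm = deque(maxlen=stm_size)     # short-term memory: newest (x, y) pairs
            self.ltm = []                         # long-term memory: older pairs kept around
            self.stm_hits = deque(maxlen=hist)    # recent per-memory correctness
            self.ltm_hits = deque(maxlen=hist)

        def _vote(self, mem, x):
            return fit_knn(mem, self.k).predict([x])[0] if len(mem) >= self.k else None

        def predict(self, x):
            stm_pred = self._vote(list(self.stm), x)
            ltm_pred = self._vote(self.ltm, x)
            stm_acc = np.mean(self.stm_hits) if self.stm_hits else 1.0
            ltm_acc = np.mean(self.ltm_hits) if self.ltm_hits else 0.0
            # use the memory with the better recent track record (ties favour the STM)
            if ltm_pred is None or (stm_pred is not None and stm_acc >= ltm_acc):
                return stm_pred
            return ltm_pred

        def partial_fit(self, x, y):
            x = np.asarray(x, dtype=float)
            # track how well each memory would have predicted the new sample
            for mem, hits in ((list(self.stm), self.stm_hits), (self.ltm, self.ltm_hits)):
                pred = self._vote(mem, x)
                if pred is not None:
                    hits.append(pred == y)
            if len(self.stm) == self.stm.maxlen:  # oldest STM sample spills into the LTM
                self.ltm.append(self.stm[0])
            self.stm.append((x, y))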